Search Results: "duck"

9 August 2015

Simon Kainz: DUCK challenge: week 5

Slightly delayed, but here are the stats for week 5 of the DUCK challenge: So we had 10 packages fixed and uploaded by 10 different uploaders. A big "Thank You" to you!! Since the start of this challenge, a total of 59 packages were fixed. Here is a quick overview:
Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7
# Packages 10 15 10 14 10 - -
Total 10 25 35 49 59 - -
The list of the fixed and updated packages is available here. I will try to update this ~daily. If I missed one of your uploads, please drop me a line. Only 2 more weeks to DebConf15, so please get involved: The DUCK Challenge is running until the end of DebConf15! Previous articles are here: Week 1, Week 2, Week 3, Week 4.

31 July 2015

Simon Kainz: DUCK challenge: week 4

The DUCK challenge is making quite stable progress: in the last 4 weeks there were approximately 12.25 packages fixed and uploaded per week. In the current week the following packages were fixed and uploaded into unstable: So we had 14 packages fixed and uploaded by 10 different uploaders. A big "Thank You" to you!! Since the start of this challenge, a total of 49 packages, uploaded by 31 different persons, were fixed. Here is a quick overview:
Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7
# Packages 10 15 10 14 - - -
Total 10 25 35 49 - - -
The list of the fixed and updated packages is available here. I will try to update this ~daily. If I missed one of your uploads, please drop me a line. DebConf15 is approaching quite fast, so please get involved: The DUCK Challenge is running until the end of DebConf15! Previous articles are here: Week 1, Week 2, Week 3.

24 July 2015

Simon Kainz: DUCK challenge: week 3

One more update on the DUCK challenge: In the current week, the following packages were fixed and uploaded into unstable: So we had 10 packages fixed and uploaded by 8 different uploaders. A big "Thank You" to you!! Since the start of this challenge, a total of 35 packages, uploaded by 25 different persons, were fixed. Here is a quick overview:
Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7
# Packages 10 15 10 - - - -
Total 10 25 35 - - - -
The list of the fixed and updated packages is available here. I will try to update this ~daily. If I missed one of your uploads, please drop me a line. There is still lots of time till the end of DebConf15 and the end of the DUCK Challenge, so please get involved. Previous articles are here: Week 1, Week 2.

17 July 2015

Simon Kainz: DUCK challenge: week 2

Just a little update on the DUCK challenge: In the last week, the following packages were fixed and uploaded into unstable: Last week we had 10 packages uploaded & fixed; the current week resulted in 15 fixed packages. So there are currently 25 packages fixed by 20 different uploaders. I really hope I can meet you all at DebConf15!! The list of the fixed and updated packages is available here. I will try to update this ~daily. If I missed one of your uploads, please drop me a line. A big "Thank You" to you. There is still lots of time till the end of DebConf15 and the end of the DUCK Challenge, so please get involved. And remember: debcheckout fails? FIX MORE URLS

9 July 2015

Simon Kainz: DUCK challenge: week 1

After announcing the DUCK challenge last week, the following packages were fixed and uploaded into unstable: A big "Thank You" to you. The list of the fixed and updated packages is available here. I will try to update this ~daily. If I missed one of your uploads, please drop me a line. There is still lots of time till the end of DebConf15 and the end of the DUCK Challenge, so please get involved.

2 July 2015

Simon Kainz: DUCK challenge at DebConf15

New features in DUCK

Carnivore-* data

DUCK now uses the carnivore name/email tables from UDD, giving a nice list of packages grouped by Maintainer/Uploader names.

Domain grouping

A per-domain listing is now also available here.

DUCK challenge at DebConf15

After announcing DUCK in mid-June 2012, the number of source packages with issues is still somewhat stable, around 1700. After a recent update of the curl libs, I also managed to get rid of 200 false positives caused by SSL-verification issues, as can be seen here. To speed things up a bit and lower the number of broken links, I hereby propose the following challenge: The first 99 persons who fix at least 1 broken URL and upload the fixed package before the end of DebConf15 will get an awesome "200 OK" DUCK-branded lighter at DebConf15!

[images: Lighter; Army of Lighters]

The challenge starts right now! I will try hard not to forget anyone who fixes packages (note the s ;-), but if you feel missed out, please contact me at DC15. Also, please remember that this is not a valid excuse to NMU packages ;-).

5 May 2015

Paul Wise: The #newinjessie game: developer & QA tools

Continuing the #newinjessie game: There are a number of development and QA tools that are new in jessie:

4 April 2015

Elena 'valhalla' Grandi: Leap second on 31 march


A couple of days ago this appeared in my system logs


Mar 31 23:59:59 kernel: Clock: inserting leap second 23:59:60 UTC


my first reaction of course was "great! they gave us one second more of sleep! MY PRECIOUSSSS", but then I realized that yes, this year there was supposed to be a leap second, but it should have been in June, not in March.

Other people I know noticed the message, but nobody knew anything else about it, and duckduckgoing didn't find anything, so I'm asking the lazyweb: does anybody know what happened?

Update: it seems that this has been traced to a single layer-1 NTP server.

12 February 2015

Sven Hoexter: Out of the comfort zone: What I learned from my net-snmp backport on CentOS 5 and 6

This is a short roundup of things I learned after I rolled out the stuff I wrote about here.

Bugs and fighting with multiarch rpm/yum

One oversight led me to not special-case the perl-devel dependency of the net-snmp-devel package for CentOS 5 to depend on perl instead. That was easily fixed, but afterwards a yum install net-snmp-devel still failed because it tried to install the stock CentOS 5 net-snmp-devel package and its dependencies. Closer investigation showed that it did so because I only provided x86_64 packages in our internal repository, but it wanted to install both i386 and x86_64 packages. Looking around, this issue is documented in the Fedora Wiki. So the modern way to deal with it is to make the dependency architecture-dependent with the
%{?_isa}
macro. That is evaluated at build time of the src.rpm, and the dependency is then hardcoded together with the architecture. Compare
$ rpm -qRp net-snmp-devel-5.7.2-18.el6.x86_64.rpm | grep perl
perl-devel(x86-64)
to
$ rpm -qRp net-snmp-devel-5.7.2-19.el5.centos.x86_64.rpm | grep perl
perl
As you can see, it's missing for the el5 package, and that's because it's too old; more specifically, rpm is too old to know about it. The workaround we use for now is to explicitly install only the x86_64 version of net-snmp-devel on CentOS 5 when required. That's a
yum install net-snmp-devel.x86_64
Another possible workaround is to remove i386 from your repository or just blacklist it in your yum configuration. I've read that's possible but did not try it.

Steal time still missing on CentOS 6

One of the motivations for this backport was support for steal time reporting via SNMP. For CentOS 5 that's solved with our net-snmp backport, but for CentOS 6 we still do not see steal time. A short look around showed that it's also missing in top, vmstat and friends. Could it be a kernel issue? Since we're already running on the CentOS Xen hypervisor, we gave the Linux 3.10.x packages a spin in a domU, and now we also have steal time reporting. Yes, a kernel issue with the stock CentOS 6/RHEL 6 kernel. I'm not sure where and how to file it since Red Hat moved to KVM.
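The post doesn't show the spec file itself, but the arch-qualified dependency described above would look roughly like this in a spec (a sketch; package name as in the post):

```spec
# %{?_isa} expands to the arch qualifier, e.g. "(x86-64)" on an x86_64
# build, so only that architecture's package satisfies the dependency.
# On an rpm too old to know the macro, it expands to nothing, leaving
# a plain "perl-devel" requirement.
Requires: perl-devel%{?_isa}
```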

30 January 2015

Laura Arjona: Going selfhosting: Installing Debian Wheezy in my home server

It was in my mind to open a new series of articles on the topic of selfhosting, because I really believe in free-software-based network services, and for a long time I have wanted to plug in a machine 24/7 at home to host my blog, microblog, MediaGoblin, XMPP server, mail and, in conclusion, all the services that now I trust to very kind third parties that run them with free software, but that I know I could run myself (and offer to my family and friends). Last September I bought the domain larjona.net (curious, they say "buy" but it's a rent, for 1, 2, 3 years, never yours. Pending: another post about my adventures with the domain name, dynamic DNS, and SSL certs!) and I bought an HP Microserver G7 N54L, with 2 GB RAM. It had a 250 GB SATA harddisk and I bought 2 more SATA harddisks, 1 TB each, to set up a RAID 1 (mirror). Total cost (with keyboard and mouse): 300 €. A friend gave me a TFT monitor that was too old for him (1024×768) but it serves me well (it's a server, no graphical interface, and I will connect remotely most of the time).

Installing Debian stable (wheezy)

I decided to install Debian stable. Jessie was not frozen yet, and since it was my first non-LAMP server install, I wanted to make sure that errors and problems would be my errors, not issues of the not-yet-released distro. I thought of installing YunoHost or some other distro prepared for selfhosting, but I've never tried them, and I don't have much free time, so I decided to stick with Debian, my beloved distro, because it's the one that I know best and I'm part of its awesome community. And maybe I could contribute back some bug reports or documentation. I wanted to try a crypto setup (just for fun, just to learn, for its benefits, and to be one more free-crypto tester in the world), so after reading a bit:

https://wiki.debian.org/DebianInstaller/SataRaid
https://wiki.archlinux.org/index.php/disk_encryption
http://madduck.net/docs/cryptdisk/
http://linuxgazette.net/140/kapil.html
http://smcv.pseudorandom.co.uk/2008/09/cryptroot/
http://www.linuxquestions.org/questions/linux-security-4/lvm-before-and-after-encryption-871379/

and some other pages, and trying some different things, this is the setup that I managed to configure. Everything went well. Yay!

Some doubts and one problem

Everything went quite well, except for some doubts. After talking about these issues with friends (and in the debian-women IRC channel), I decided to install the non-free driver, just in case, with the same reasoning as with the RAID: let the card do the job, so the CPU can care about other things. Again, I notice that learning a bit about benchmarking (and having some time to do some tests) would be nice. And now, the problem: I left this problem apart and went on installing the software. I would think later about what to do.

Installing MediaGoblin

The most urgent selfhosting service, for me, was GNU MediaGoblin, because I wanted to show my server to my family at Christmas, and upload the pictures of the babies and kids of the family. And it's a project where I contribute translations and I am a big fan, so I would be very proud of hosting my own instance. I followed the documentation to set up 2 instances of GNU MediaGoblin 0.7 (the stable release at the moment), with their corresponding PostgreSQL databases. Why two instances? Well, I want an instance to host and show my videos, images, and replicate videos that I like, and a private one for sharing photos and videos with my family. MediaGoblin has no privacy settings yet, so I installed separate instances, and I put the private one on a different port, with a self-signed SSL cert, and enabled HTTP authorization in Nginx, so only authorized Linux users of my machine can access the website. Installing MediaGoblin was easier than I thought. I only had some small doubts about the documentation, and they were solved in the IRC channel. You can access, for example, my user profile in my public instance, and see some different files that I already uploaded. I'm very happy!!
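The Nginx arrangement for the private instance can be sketched like this (hypothetical port and paths; the post doesn't include the actual config, and the authorization of local users may really have been done with a PAM module rather than an htpasswd file):

```nginx
server {
    listen 8443 ssl;                               # hypothetical port
    ssl_certificate     /etc/nginx/ssl/goblin.crt;  # self-signed pair
    ssl_certificate_key /etc/nginx/ssl/goblin.key;

    auth_basic           "Private MediaGoblin";
    auth_basic_user_file /etc/nginx/htpasswd;       # created with htpasswd

    location / {
        proxy_pass http://127.0.0.1:6543;           # MediaGoblin backend
    }
}
```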
Face to face with the bug, again

I had to solve the problem of the password not being accepted on reboots. I began to think that it could be a bug in cryptsetup. Should I upgrade the package to the version in wheezy-backports? Jessie was almost frozen; maybe it was time to upgrade the whole system, to see if the problem was solved (and to see whether my MediaGoblin was working or not in Jessie. It should work, it's almost packaged! But who knows). And if it didn't work, maybe it was time to file a bug. So I upgraded my system to Debian Jessie. And after the upgrade, the system didn't boot. But that's the story of another blog post (that I still need to finish writing; don't worry, it has a happy end, as you could see by accessing my MediaGoblin site!). Comments? You can comment in this pump.io thread.
Filed under: My experiences and opinion Tagged: Debian, encryption, English, libre software, MediaGoblin, Moving into free software, N54L, selfhosting, sysadmin

4 January 2015

Russ Allbery: Review: Programming Ruby

Review: Programming Ruby, by Dave Thomas, et al.
Publisher: Pragmatic Bookshelf
Copyright: 2005
Printing: May 2006
ISBN: 0-9745140-5-5
Format: Trade paperback
Pages: 785
There are a few different editions of this book. The version I read is the second edition, written by Dave Thomas with Chad Fowler and Andy Hunt and published in 2005, covering Ruby 1.8.2. There's now a fourth edition, covering Ruby 1.9 and 2.0, which is probably what you'd want if you were buying this book today. This book, in whatever edition, is called the Pickaxe in the Ruby community after its cover. I've used a lot of different programming languages, so I can usually pick one up on the fly reasonably well, but I still like to read a good introductory book before using one seriously. It's a bit too easy to get lost or to fall into habits that don't match the best practices of the language community without a solid introduction. I've been using a bit of Ruby off and on since I started using Puppet, but I'm looking at doing more serious development using Chef, so I decided it was time to get that introduction. (It helped that I had this book sitting around, although that's also why I read an older edition.) Programming Ruby starts with the obligatory introduction to installing and running Ruby, and then provides a high-level introduction to the language and its basic types (just enough to make Ruby comprehensible) before starting into the object system. Everything is an object in Ruby, so the book introduces the object system as early as possible, and then shows the rest of the language, from constants up, in the light of that object system. The rest of part one follows the normal language introduction path, building up from constants and methods to exceptions, modules, and basic IO. It closes with chapters about threads and processes, unit testing, and the debugger. Part two is a grab-bag of one-chapter topics describing how to use Ruby in a particular setting, or showing one angle of the language. The best of those chapters for me was the one on RDoc, partly because I'm quite impressed by Ruby's documentation system.
A few of these chapters are oddly in-depth for an introductory book: I doubt I'm ever going to use all the details about special irb configuration (and if I do, I'd just look them up), but I greatly appreciated the solid chapter on how to write Ruby extensions in C. There is also the obligatory chapter on writing GUI applications with Tk, which always seems to show up in these sorts of introductions and which always baffles me. Does anyone actually do this any more instead of writing a web application? Part three dives back into the language and provides a more complete and formal description. The authors aren't afraid to get into some of the internals, which I appreciated. There is a good chapter here on the details of the type system and how objects and classes interact, and a much-needed extended discussion of duck typing. This type of weak typing and runtime binding is fundamental to how Ruby approaches objects, for better or worse. (I have mixed opinions; it makes some things easier, but I increasingly appreciate strong typing and more formal interface definitions.) Some discussion of marshalling and introspection closes out the discussion portion of the book. That's about 420 pages of material. The rest of the book is a detailed reference on all of the core classes, and a quicker overview of the standard library. Normally, this sort of thing is thrown into language introductions to pad out the page count, and usually the language's official documentation is better at this sort of reference. But I found Programming Ruby to be an exception. The reference is succinct, sticking to a paragraph or two for each method, and did a great job of providing enough cross-reference and discussion to put each class into a broader perspective. It's the most useful example of this type of reference section I've seen. I still probably won't use it after this initial reading, but I think I got a better feel for the language from reading through it.
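The duck typing the book spends time on can be shown in a few lines (my sketch, not an example from the book):

```ruby
# Duck typing: the caller only cares that the object responds to #quack;
# no shared superclass or declared interface is needed.
class Duck
  def quack
    "Quack!"
  end
end

class Robot
  def quack
    "beep beep"
  end
end

def make_it_quack(thing)
  thing.quack
end

puts make_it_quack(Duck.new)   # => Quack!
puts make_it_quack(Robot.new)  # => beep beep
```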
It's hard to review a book like this without reviewing the language it documents, at least a little bit. I'll indulge: it entertains me how much Ruby is obviously based on Perl, including borrowing some of Perl's more dubious ideas. The global punctuation variables will look familiar to any Perl programmer, and the oddly-named global variables for the interpreter flags are in the same spirit. The language unfortunately has similar problems as Perl with safely running commands without using the shell; it's possible, but not the default and not what the built-ins do. There are places where I wish Ruby were a little less like Perl. The plus side for an experienced Perl programmer is that Ruby feels quite familiar and has made some clear improvements. The ? and ! convention for methods that return booleans or modify objects in-place is brilliant in its simplicity, and something I'd love to see in more languages. And the way Ruby implements ubiquitous code blocks for both iterators and for any temporary objects is lovely once one gets used to it. It's similar to Python's context managers, except more general and built deeper into the language. Returning to the review of the book, rather than the topic, Programming Ruby has a good, clear explanation of blocks, iterators, and yield. If you're interested in getting a grounding in Ruby, this book still feels like a solid introduction. The edition I read is getting a bit long in the tooth now that we're on Ruby 2.1, but the pace of language change has slowed, and most of the book is still applicable. (If you're buying it new, you should, of course, get the later edition.) The table of contents makes it seem like the book is covering the same ground multiple times, but that organizational strategy worked better than I expected. Ruby is not the most organized language in the world, so I still felt a bit overwhelmed with random method names in places, but I never felt lost in the mechanics of the language. 
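The ? and ! conventions and the block/yield machinery praised above look like this in practice (my sketch, not code from the book):

```ruby
words = ["duck", "goose", "duck"]

# Predicate methods end in ? and return a boolean.
p words.empty?            # => false
p words.include?("duck")  # => true

# Bang methods modify the receiver in place; the plain form returns a copy.
sorted = words.sort  # words is untouched
words.sort!          # words itself is now ["duck", "duck", "goose"]

# Any method can take a block; yield calls it, here twice.
def twice
  yield
  yield
end

twice { puts "quack" }
```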
In short, recommended if you want a good introduction to the language, although probably in a later edition. Rating: 8 out of 10

1 January 2015

Russ Allbery: Review: Three Parts Dead

Review: Three Parts Dead, by Max Gladstone
Series: Craft #1
Publisher: Tor
Copyright: October 2012
ISBN: 1-4668-0203-0
Format: Kindle
Pages: 336
Tara Abernathy was a student in the Hidden Schools, learning Craft, until she was expelled. Literally expelled: thrown from the floating schools to crash painfully to earth in the Badlands, left to return to her family and village and a life of small workings of Craft and contracts on behalf of local farmers. She had largely resigned herself to that life until raiders started killing people. Tara is not the sort of person who could stand by and watch that, or someone to refrain from using Craft to fix the world. The result was undead guardians for the town, perhaps unwisely formed from the town's risen dead, and only a job offer saves Tara from the ungrateful attention of her neighbors. That's how Tara finds herself employed by the firm of Kelethras, Albrecht, and Ao, in the person of partner Elayne Kevarian. Provisionally, depending on her performance on their job: the investigation of the death of a god. It's possible to call Three Parts Dead urban fantasy if you squint at it the right way. It is fantasy that takes place largely in cities, it features the investigation of a crime (and, before long, several crimes), and Tara's attitude is reminiscent of an urban fantasy heroine. But this is considerably different from the normal fare of supernatural creatures. In this world, magic, called Craft, is an occupation that requires a great deal of precision and careful construction. Small workings are described much like magic, although with an emphasis on metaphor. Larger workings more often come in the form of energy flows, contracts, and careful hedging, and the large Craft firms bear more resemblance to mergers and acquisitions specialists than to schools of wizards. This means that the murder investigation of the god of Alt Coulumb involves a compelling mix of danger, magic, highly unusual library investigations, forensic accounting, hidden Craft machinery, unexpected political alliances, and an inhuman police force.
Rather than the typical urban fantasy approach of being beaten up until the resolution of the mystery becomes obvious, Tara and her companions do quite a lot of footwork and uncover a more complex political situation than they were expecting. And, in keeping with this take on magic, the story culminates in a courtroom drama (of a sort). I really enjoyed this. It combines the stylistic elements of urban fantasy that I like with some complex and original world-building and a great take on magical contracts. I prefer worlds like this one, where any source of power people have lived with for a long time is surrounded by the controls, formal analysis, and politics that humans create around anything of value. Tara is also a great protagonist. This is a coming-of-age story in a sense, and Tara is sometimes unsure of her abilities, but it's refreshingly devoid of worry or angst over new-found abilities. Tara enjoys her work, and approaches it with a well-written mix of uncertainty, impulsiveness, and self-confidence (sometimes warranted, sometimes not). I've read some stories where the protagonist gets dragged into the story against their will, and some of them are quite good, but it's refreshing to read a book about someone who takes to the story like a duck to water. This is a believable portrayal of a character with a lot of native ability and intelligence, not much wisdom (yet), but a lot of thoughtful enthusiasm. I was disappointed to learn that she isn't the protagonist of the next book in the series. The biggest flaw I found in this book is that Gladstone doesn't stick reliably to his world conception. At times, Craft collapses into something more like typical fantasy magical battles, instead of legal procedure and contract made concrete. I suppose this makes parts of the book more exciting, but I would have preferred a plot resolution that involved less combat and more argument. This isn't helped by the utterly hissable villain.
There's a lot of complexity in understanding what happened and who was going to benefit (and how), but there is absolutely no doubt who the enemy is, and he's essentially without redeeming qualities. I would have preferred more nuance, given how satisfyingly complex the rest of the world-building is. Three Parts Dead also occasionally suffers from the typical first novel problem of being a bit overstuffed. The world-building comes fast and thick, and nearly everything Tara does involves introducing new concepts. But the world does have a coherent history, and quite a lot of it. It used to be a more typical fantasy world ruled by gods, each with their own territory and worshippers (and Alt Coulumb is a throwback to this era), but an epic war between gods and Craft is in Tara's past, leading to the defeat or destruction of many of the gods. She lives in a time of uneasy truce between human and inhuman powers, featuring some very complex political and economic alliances. There's a lot of material here for an ongoing series. This is a great first novel. It's not without its flaws, but I enjoyed it from beginning to end, and will definitely keep reading the series. Recommended. Followed by Two Serpents Rise. Rating: 8 out of 10

6 June 2014

Gunnar Wolf: What defines an identity?

I must echo John Sullivan's post: GPG keysigning and government identification. John states some very important reasons for people everywhere to verify the identities of those parties they sign GPG keys with in a meaningful way, and that means not just trusting government-issued IDs. As he says, it's not the Web of Amateur ID Checking. And I'll take the opportunity to expand, based on what some of us saw in Debian, on what this means. I know most people (even most people involved in Free Software development; not everybody needs to join a globally-distributed, thousand-people-strong project such as Debian) are not that much into GPG, trust keyrings, or understand the value of a strong set of cross-signatures. I know many people have never been part of a key-signing party. I have been to several. And it was a very interesting experience. Fun, at the beginning at least, but quite tiring at the end. I was part of what could very well constitute the largest KSP ever, in DebConf5 (Finland, 2005). Quite awe-inspiring: we were over 200 people, all lined up with a printed list in one hand, our passport (or ID card for EU citizens) in the other. Actually, we stood face to face, in a ribbon-like ring. And, after the basic explanation was given, it was time to check ID documents. And so it began. The rationale of this ring is that every person who signed up for the KSP would verify each of the others' identities. Were anything fishy to happen, somebody would surely raise a voice of alert. Of course, the interaction between every two people had to be quick, more like a game than a real check. "Hi, I'm #142 on the list. I checked, my ID is OK and my fingerprint is OK." "OK, I'm #35, I also printed the document and checked both my ID and my fingerprint are OK." The passport changes hands, the person in front of me takes the unique opportunity to look at a Mexican passport while I look at a Somewhere-y one. And all is fine and dandy.
The first interactions do include some chatter while we gather speed, so maybe a minute is spent. Later on, we all get a bit tired, and things speed up a bit. But anyway, we were close to 200 people. That means we surely spent over 120 minutes (2 full hours) checking ID documents. Of course, not all of the time under ideal lighting conditions. After two hours, nobody was checking anything anymore. But yes, as a group where we trust each other more than most social groups I have ever met, we did trust others to raise the alarm were anything fishy to happen. And we all finished happy and got home with a bucketload of signatures. Yay! One year later, DebConf happened in Mexico. My friend Martin Krafft tested the system, perhaps cheerful and playful in his intent, but the flaw he unveiled in key signing parties such as the one I described was huge: people join the KSP just because it's a social ritual, without putting any thought or judgement into it. And, by doing so, we ended up diluting instead of strengthening our web of trust. Martin identified himself using an official-looking ID. According to his recount of the facts, he did start by presenting a German ID and later switched to this other document. We could say it was a real ID from a fake country, or that it was a fake ID. It is up to each person to judge. But anyway, Martin brought his Transnational Republic ID document, and many tens of people agreed to sign his key based on it. Or rather, based on it plus his outgoing, friendly personality. I did, at least, know perfectly well who he was, after knowing him for three years already. Many among us also did. Until he reached a very diligent person, Manoj, who got disgusted by this experiment and loudly denounced it. Right, Manoj is known to have strong views, and using fake IDs is (or, at least, was) outside his definition of fair play.
Some time after DebConf, a huge thread erupted questioning Martin's actions, as well as questioning what we trust when we sign an identity document (a GPG key). So... We continued having traditional key signing parties for a couple of years, although more carefully and with more buzz regarding these issues. Until we finally decided to switch the protocol to a better one: one that ensures we do get some more talk and inter-personal recognition. We don't need everybody to cross-sign with everyone else. Better trust comes from people chatting with each other and being able to actually pinpoint who a person is and what they do. And yes, at KSPs most people still require ID documents in order to cross-sign. Now... What do I think about this? First of all, if we have not ever talked for at least enough time for me to recognize you, don't be surprised: I won't sign your key or request you to sign mine (and note, I have quite a bad memory when it comes to faces and names). If it's the first conference (or social occasion) we come together, I will most likely not look for key exchanges either. My personal way of verifying identities is by knowing the other person. So, no, I won't trust a government-issued ID. I know I will be signing some people based on something other than their name, but hey, I know many people already who live pseudonymously, and if they choose for whatever reason to forgo their original name, their original name should not mean anything to me either. I know them by their pseudonym, and based on that pseudonym I will sign their identities. But... *sigh*, this post turned out quite long, and I'm not yet getting anywhere ;-) But what this means in the end is: we must stop and think what we mean when we exchange signatures. We are not validating a person's worth. We are not validating that a government believes who they claim to be. We are validating that we trust them to be identified with the (name, mail, affiliation) they are presenting us.
And yes, our signature is much more than just a social rite: it is a binding document. I don't know if a GPG signature is legally binding anywhere (I'm tempted to believe it is, as most jurisdictions do accept digital signatures, and the procedure is mathematically sound and cryptographically strong), but it does have a high value for our project, and for many other projects in the Free Software world. So, wrapping up, I will also invite you (just like John did) to read the E-mail Self-Defense guide, published by the FSF in honor of today's Reset The Net effort.

13 April 2014

Jeff Licquia: My Heart Bleeds (or, What's Going On With Heartbleed)

[en] One of the big news stories of the week has been the "Heartbleed bug". If you know a techie person, you might have noticed that person looking a bit more stressed and tired than usual since Monday (that was certainly true of me). Some of the discussion might seem a bit confusing and/or scary; what's worse, the non-tech press has started getting some of the details wrong and scare-mongering for readers. So here's my non-techie guide to what all the fuss is about. If you're a techie, this advice isn't for you; chances are, you already know what you should be doing to help fix this. (If you're a techie and you don't know, ask! You might just need a little education on what needs to happen, and there's nothing wrong with that, but you'll be better off asking and possibly looking foolish than you will be if you get hacked.) If you're not inclined to read the whole thing, here are the important points:
  • Don't panic! There are reports of people cleaning out their bank accounts, cutting off their Internet service, buying new computers, etc. If you're thinking about doing anything drastic because you're scared of Heartbleed, don't.
  • You'll probably need to change a lot of your passwords on various sites, but wait until each site you use tells you to.
  • This is mostly a problem for site servers, not PCs or phones or tablets. Unless you're doing something unusual (and you'd know if you were), you're fine as long as you update your devices like you usually do. (You do update your devices, right?)
So what happened? There's a notion called a "heartbeat signal", where two computers talking to each other say "Hey, you there?" every so often. This is usually done by computer #1 sending some bit of data to computer #2, and computer #2 sending it back. In this particular situation, the two computers actually send both a bit of data and the length of that bit of data. Some of you might be asking "so what happens if computer #1 sends a little bit of data, but lies and says the data is a lot longer than that?" In a perfect world, computer #2 would scold computer #1 for lying, and that's what happens now with the bug fix. But before early this week, computer #2 would just trust computer #1 in one very specific case. Now, computers use memory to keep track of stuff they're working on, and they're constantly asking for memory and then giving it back when they're done, so it can be used by something else. So, when you ask for memory, the bit of memory you get might have the results of what the program was doing just a moment ago: things like decrypting a credit card using a crypto key, or checking a password. This isn't normally a problem, since it's the same program getting its own memory back. But if it's using this memory to keep track of these heartbeats, and it's been tricked into thinking it needs to send back "the word HAT, which is 500 characters long", then character 4 and following is likely to be memory used for something just a moment ago. Most of that recycled memory would be undecipherable junk. But credit cards, crypto keys, and passwords tend to be fairly easy to pick out, unfortunately. And that, by the way, is where the name comes from: the heartbeat signal "bleeds" data, so "Heartbleed". There's been some fascinating commentary on how well this bug has been marketed, by the way; hopefully, we in the techie community will learn something about how to explain problems like this for future incidents. Does this affect every site? No.
Only sites using certain newer versions of cryptographic software called OpenSSL are affected by this. OpenSSL is very popular; I've seen estimates that anywhere from a third to a half of all secure Internet sites use it. But not all of those sites will have the bug, since it was only introduced in the last two years. How do we know this? OpenSSL is open source, and is developed "in public". Because of that, we know the exact moment when the bug was introduced, when it was released to the world, and when it was fixed. (And, just for the record, it was an honest mistake. Don't go and slam on the poor guy who wrote the code with the bug. It should have been caught by a number of different people, and none of them noticed it, so it's a lot more complicated than "it's his fault! pitchforks and torches!") What should I do? Nothing, yet. Right now, this is mostly a techie problem. Remember that bit about crypto keys? That's the part which puts the little lock icon next to the URL in your browser when you go to your bank's Web site, or to Amazon to buy things, or whatever. The crypto keys make sure that your conversation with your bank about your balance is just between you and your bank. That's also the part which is making techies the world over a little more stressed and tired. You see, we know that the people who found the bug were good guys and helped to get the bug fixed, but we don't know if any bad guys found the bug before this week. And if a bad guy used the bug to extract crypto keys, they would still have those crypto keys, and could still use them even though the original bug is fixed. That would mean that a bad guy could intercept your conversation with your bank / Amazon / whoever. Since we don't know, we have to do the safe thing, and assume that all our keys were in fact stolen. That means we have to redo all our crypto keys. That's a lot of work.
And because your password is likely protected with those same crypto keys, if a bad guy has Amazon's key, they'd be able to watch you change your password at Amazon. Maybe they didn't even have your old password, but now they have your new one. Oops. You're now less secure than you were. Now, it's important to make sure we're clear: we don't know that this has happened. There's really no way of knowing, short of actually catching a bad guy in the act, and we haven't caught anyone yet. So, this is a safety measure. Thus, the best thing to do is: don't panic. Continue to live life as usual. It might be prudent to put off doing some things for a few days, but I wouldn't even worry so much about that. If you pay your bills online, for example, don't risk paying a bill late out of fear. Remember: so far, we have no evidence yet that anyone's actually doing anything malicious with this bug. At some point, a lot of sites are going to post a notice that looks a lot like this:
We highly recommend our users change the password on their Linux Foundation ID which is used for the logins on most Linux Foundation sites, including our community site, Linux.com for your own security and as part of your own comprehensive effort to update and secure as many of your online credentials as you can.
(That's the notice my employer posted once we had our site in order.) That will be your cue that they've done the work to redo their crypto keys, and that it's now safe to change your password. A lot of sites will make statements saying, essentially, "we don't have a problem". They're probably right. Don't second-guess them; just exhale, slowly, and tick that site off your list of things to worry about. Other sites might not say anything. That's the most worrying part, because it's hard to tell if they're OK or not. If it's an important site to you, the best course of action might be to just ask, or search on Google / Bing / DuckDuckGo / wherever for some kind of statement. What about your site? Yup, I use OpenSSL, and I was vulnerable. But I'm the only person who actually logs in to anything on this site. I've got the bugfix, but I'm still in the process of creating new keys. Part of the problem is that everyone else is out there creating new keys at the same time, which creates a bit of a traffic jam. So yeah, if you were thinking of posting your credit card number in a comment, and wanted to make sure you did it securely... well, don't do that. EVER. And not because of Heartbleed.
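The over-read described above can be mimicked in a few lines of shell. This is only a toy analogy, not real OpenSSL code: a scratch file stands in for a reused chunk of memory, and the "secret" and the claimed length are made-up values for illustration.

```shell
#!/bin/sh
# Toy analogy of the Heartbleed over-read (NOT real OpenSSL code).
# A scratch file stands in for a chunk of reused memory.
mem=$(mktemp)
printf 'SECRET-CRYPTO-KEY-1234' > "$mem"              # leftovers from earlier work
printf 'HAT' | dd of="$mem" conv=notrunc 2>/dev/null  # heartbeat payload: "HAT"
claimed_len=22    # ...but the peer lies and claims the payload is 22 bytes long
leak=$(head -c "$claimed_len" "$mem")  # trusting the claim echoes old secrets
echo "$leak"                           # prints HATRET-CRYPTO-KEY-1234
rm "$mem"
```

Only the first three bytes are the real payload; everything after them is whatever the "memory" held before. Refusing to answer with more bytes than were actually received is exactly what the bug fix does.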

3 April 2014

Gunnar Wolf: DrupalCamp Mexico City: April 23-25

We are organizing a DrupalCamp in Mexico City! As a Drupal user, I have so far attended two DrupalCamps (one in Guadalajara, Mexico, and one in Guatemala, Guatemala). They are, as Free Software conferences usually are, great informal settings where many like-minded users and developers meet, exchange all kinds of contacts and information, and have a good time. This year, I am a (minor) part of the organizing team. DrupalCamp will be held in Torre de Ingeniería, UNAM, just by Facultad de Ingeniería, where I teach: a modern, beautiful building in Ciudad Universitaria. Talks, tracks So, who is this for? You can go look at the accepted sessions; you will find there is a lot of ground, starting from the very introduction to how Drupal is structured and some tips on how to work with it (delivered by yours truly), through workflows for specific needs, to strong development-oriented talks. The talks are structured along five tracks: "Training", "Theming", "Development", "Business" and "SymfonyDay". "SymfonyDay"? Yes. Drupal is a fast-evolving Free Software project. Most users are currently using versions 6 and 7, which are as different from each other as day and night... But the upcoming Drupal 8 brings even greater changes. One of the most interesting changes I can see is that Drupal will now be based on a full MVC framework, Symfony. One of the days of our DrupalCamp will be devoted to Symfony (dubbed the Symfony Day). ...And... Again, just look at the list of talks. You will find a great number of speakers interested in coming here. Not just from Mexico City. Not just from Mexico. Not just from Latin America. I must say I am personally impressed. Sponsors! Of course, as with any volunteer-run conference: we are still looking for sponsors. We believe being a DrupalCamp sponsor will greatly increase your brand visibility in the community you want to work with.
There are still a lot of expenses to cover to make this into all that we want. And surely, you want to be a part of this great project. There are many sponsor levels. Surely you can be part of it!

25 March 2014

Petter Reinholdtsen: Public Trusted Timestamping services for everyone

Did you ever need to store logs or other files in a way that would allow them to be used as evidence in court, and needed a way to demonstrate beyond reasonable doubt that the file had not been changed since it was created? Or, did you ever need to document that a given document was received at some point in time, like some archived document or the answer to an exam, and not changed after it was received? The problem in these settings is to remove the need to trust yourself and your computers, while still being able to prove that a file is the same as it was at some given time in the past. A solution to these problems is to have a trusted third party "stamp" the document and verify that at some given time the document looked a given way. Such notary services have been around for thousands of years, and their digital equivalent is called a trusted timestamping service. The Internet Engineering Task Force standardised how such a service could work a few years ago as RFC 3161. The mechanism is simple: create a hash of the file in question and send it to a trusted third party, which adds a time stamp to the hash, signs the result with its private key, and sends back the signed hash + timestamp. Email, FTP and HTTP can all be used to request such a signature, depending on what is provided by the service used. Anyone with the document and the signature can then verify that the document matches the signature by creating their own hash and checking the signature using the trusted third party's public key. There are several commercial services providing such timestamping. A quick search for "rfc 3161 service" pointed me to at least DigiStamp, Quo Vadis, Global Sign and Global Trust Finder. The system works as long as the private key of the trusted third party is not compromised. But as far as I can tell, there are very few public trusted timestamp services available for everyone. I've been looking for one for a while now.
But yesterday I found one over at Deutsches Forschungsnetz, mentioned in a blog by David Müller. I then found a good recipe on how to use the service over at the University of Greifswald. The OpenSSL library contains both a server and tools to use and set up your own signing service. See the ts(1SSL) and tsget(1SSL) manual pages for more details. The following shell script demonstrates how to extract a signed timestamp for any file on the disk in a Debian environment:
#!/bin/sh
set -e
url="http://zeitstempel.dfn.de"
caurl="https://pki.pca.dfn.de/global-services-ca/pub/cacert/chain.txt"
reqfile=$(mktemp -t tmp.XXXXXXXXXX.tsq)
resfile=$(mktemp -t tmp.XXXXXXXXXX.tsr)
cafile=chain.txt
if [ ! -f $cafile ] ; then
    wget -O $cafile "$caurl"
fi
openssl ts -query -data "$1" -cert | tee "$reqfile" \
    | /usr/lib/ssl/misc/tsget -h "$url" -o "$resfile"
openssl ts -reply -in "$resfile" -text 1>&2
openssl ts -verify -data "$1" -in "$resfile" -CAfile "$cafile" 1>&2
base64 < "$resfile"
rm "$reqfile" "$resfile"
The argument to the script is the file to timestamp; the output is a base64-encoded version of the signature on STDOUT and details about the signature on STDERR. Note that due to a bug in the tsget script, you might need to modify the included script and remove the last line. Or just write your own HTTP uploader using curl. :) Now you too can prove and verify that files have not been changed. But the Internet needs more public trusted timestamp services. Perhaps something for Uninett, or my workplace the University of Oslo, to set up?
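The first step of the protocol, hashing the file locally, can be reproduced on its own without contacting any service. A minimal sketch (the file name and contents are hypothetical):

```shell
#!/bin/sh
# Hash a file locally; only this digest, never the file itself,
# needs to be sent to an RFC 3161 timestamping service.
printf 'example document' > /tmp/doc.txt    # hypothetical input file
hash=$(sha256sum /tmp/doc.txt | cut -d' ' -f1)
echo "$hash"
```

This also shows a nice property of the scheme: the service only ever sees the digest, so the document itself never leaves your machine.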

10 March 2014

Simon Kainz: DUCK for packagers

I finally managed to package duck 0.4 for Debian/unstable. In addition to duck.debian.net, you can now run all checks (in fact even more than on the website) on your own. Intended as a helper tool for packaging work, it inspects debian/control files as well as upstream metadata files. By trying to use the appropriate tool for each URL (e.g. using git ls-remote for Git repos), it tries to find out if a given repository is available. Email addresses in Maintainer and Uploaders fields, as well as mailto: URLs, are checked by trying to find MX, A or AAAA records for the email domain. Usage Without any additional options, duck looks for the following files: Just hop into your extracted package source tree (the one containing the debian subdir), and run:
$ duck
If everything is ok, you won't get any output. To see what's going on, run:
$ duck -v
debian/control: Maintainer: Simon Kainz simon@familiekainz.at: OK
debian/control: Vcs-Git: git://anonscm.debian.org/collab-maint/duck.git: OK
debian/control: Vcs-Browser: http://anonscm.debian.org/gitweb/?p=collab-maint/duck.git: OK
debian/control: Homepage: http://duck.debian.net: OK
Errors show up like this:
$ duck -v
debian/control: Maintainer: Simon Kainz simon@domain.invalid: ERROR
Simon Kainz simon@domain.invalid: No MX entry found.
Simon Kainz simon@domain.invalid: No A entry found.
Simon Kainz simon@domain.invalid: No AAAA entry found.
debian/control: Vcs-Git: git://anonscm.insanetypo..debian.org/collab-maint/duck.git: ERROR
fatal: unable to connect to anonscm.insanetypo..debian.org:
anonscm.insanetypo..debian.org: Name or service not known
debian/control: Vcs-Browser: http://anonscm.debian.org/gitweb/?p=collab-maint/duck.git: OK
debian/control: Homepage: http://duck.debian.net: OK
Missing features Currently the following VCS URLs are not supported: If someone knows how to get the state of one of the repos based on the VCSs above, preferably without needing to check out the complete source code, please contact me. And remember: "Always duck before dput" :-)
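The Git reachability probe mentioned above can be approximated in plain shell. This is a rough sketch of the idea rather than duck's actual implementation, and the repository URL is just the example from the output above:

```shell
#!/bin/sh
# Rough sketch of a duck-style Vcs-Git check: probe the remote without cloning.
url="git://anonscm.debian.org/collab-maint/duck.git"   # example URL from above
status=OK
# git ls-remote lists the remote's refs without fetching any history;
# any failure (unreachable host, missing repo) marks the URL as broken.
git ls-remote "$url" >/dev/null 2>&1 || status=ERROR
echo "debian/control: Vcs-Git: $url: $status"
```

The appeal of ls-remote for this job is that it answers "does this repository exist and respond?" with a single round trip, with no need to check out the complete source code.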

21 February 2014

Jakub Wilk: For those who care about snowclones

Instances of the "for those who care about X" snowclone on Debian mailing lists:

12 February 2014

Mario Lang: Roughly 1500 source packages have possibly broken links in debian/control

There are currently roughly 1500 source packages in Debian which possibly (very likely actually) do have broken URLs in debian/control. While it is quite useful that we have VCS information and Homepage URLs in the Packages file these days, we also created a rather big source of bitrot. These URLs are typically paste-and-forget. Sure, people occasionally catch the fact that a homepage or VCS has moved, especially if they are active and in good contact with their upstreams. However, there are also other cases...
DUCK to the rescue! My coworker Simon Kainz has worked on a service that helps at least to track which URLs are currently broken. We initially discussed some way to make this a part of Lintian, which is what I would have preferred. However, for good reasons, Lintian doesn't want to call out to the net by default, so these checks would likely not get run by many developers anyway. So Simon ended up creating DUCK, the Debian Url ChecKer, which actually goes out to the net and verifies that all the Vcs-* and Homepage fields of debian/control are actually reachable. The frontend allows developers to search for packages they maintain, to quickly see if they have any URLs which are possibly broken. There is a slight chance of temporary network problems of course, so DUCK shows the status of checks over the last few days, so that you can quickly see if you are dealing with a typical false positive. First of all, thanks to Simon! I think DUCK is an excellent project for a future NM candidate. I actually already wanted to advocate him for DD, but we ended up on a website which suggested that new contributors should start by applying for DM these days, and only later go for DD. I find that actually quite strange, especially in Simon's case, but well, we did not feel like arguing. Secondly, and that's also very important: what's needed to improve the overall quality of URLs in the package system is your attention! You can easily search for email addresses of maintainers or uploaders. Team members can create a bookmark entry that checks for problems in all packages maintained by the teams they are a member of. You just need to actually visit these pages from time to time. It would probably not be well received if we filed all these bugs at once :-). So we need you to care, since we don't want to generate too much noise regarding "just these broken URLs". Or are they? If your Vcs fields are broken, debcheckout will not work properly, which defeats the purpose of debcheckout. If your Homepage URL is broken, packages.d.o will also have a wrong link. 1500 packages with broken URLs. Don't you think we can do better than that?

7 February 2014

Jo Shields: Dear Fake Debian Developers, shoo.

Another post about the Valve/Collabora free games thing. This time, the bad bit: people trying to scam free games from us. Before I start, I want to make one thing clear: there are people who have requested keys who don't meet the criteria, but are honest and legitimate in their requests. This blogspam is not about them at all. If you're in that category, you're not being complained about. So. Some numbers. At time of writing, I've assigned keys to 279 Debian Developers or Debian Maintainers, almost 25% of the total eligible pool of about 1200. I've denied 22 requests. Of these, 10 were polite requests from people who didn't meet the conditions stated (e.g. Ubuntu developers rather than Debian). These folks weren't at all a problem for us, and I explained politely that they didn't meet the terms we had agreed at the time with Valve. No problem at all with those folks. Next, we have the chancers, 7 of them, who pretended to be eligible when they clearly weren't. For example, two people sent me signed requests pointing to their entry on the Debian New Maintainers page when challenged over the key not being in the keyring. The NM page showed that they had registered as non-uploading Debian Contributors a couple of hours previously. A few just claimed "I am a DD, here is my signature" when they weren't DDs at all. Those requests were also binned.
Papers, Please screenshot - denied entry application

DENIED

And then we move onto the final category. These people tried identity theft, but did a terrible job of it. There were 5 people in this category:
From: Xxxxxxxx Xxxxxx <xxxxxxxx.xxxxxx@ieee.org>
Subject: free subscription to Debian Developer
8217 A205 5E57 043B 2883 054E 7F55 BB12 A40F 862E
This is not a signature, it's a fingerprint. Amusingly, it's not the fingerprint for the person who sent the mail, but that of Neil McGovern, a co-worker at Collabora. Neil assured me he knew how to sign a mail properly, so I shitcanned that entry.
From: "Xxxxx, Xxxxxxxxx" <x.xxxxx@bbw-bremen.de>
Subject: Incoming!
Hey dude,
I want to have the redemption code you are offering for the Valve Games
mQGiBEVhrscRBAD4M5+qxhZUD67PIz0JeoJ0vB0hsLE6QPV144PLjLZOzHbl4H3N
...snip...
Lz8An1TEmmq7fltTpQ+Y1oWhnE8WhVeQAKCzh3MBoNd4AIGHcVDzv0N0k+bKZQ=3D=3D
=3Du/4R
Wat? Learn to GPG!
From: Xxxxxx-Xxxx Le Xxxxxxx Xxxx <xx.xxxxxxxxx@gmail.com>
Subject: pass steam
Hey me voila
Merci beaucoup
valve
2069 1DFC C2C9 8C47 9529 84EE 0001 8C22 381A 7594
Like the first one, a fingerprint. This one is for Sébastien Villemot. Don't scammers know how to GPG sign?
From: "Xxxxxxxxx Xxxxxxx" <xxxxxxxx@web.de>
Subject: thanks /DD/Steam gifts us finally something back
0x6864730DF095E5E4
Yet again, a fingerprint. This one is for Marco Nenciarini. I found this request particularly offensive due to the subject line; the haughty tone from an identity thief struck me as astonishingly impertinent. Still, when will scammers learn to GPG?
From: Sven Hoexter <svenhoexter@gmail.com>
Subject: Valve produced games
I'm would like to get the valve produced games
My keyring: 0xA6DC24D9DA2493D1 Sven Hoexter <hoexter> sig:6
Easily the best scam effort, since this is the only one which both a) registered an email address under the name of a DD, and b) used a fingerprint which actually corresponds to that human. Sadly for the scammer, I'm a suspicious kind of person, so my instinct was to verify the claim via IRC.
31-01-2014 16:52:48 > directhex: Hoaxter, have you started using gmail without updating your GPG key? (note: significantly more likely is someone trying to steal your identity a little to steal valve keys from collabora)
31-01-2014 16:54:51 < Hoaxter!~sh@duckpond6.stormbind.net: directhex: I do not use any Google services and did not change my key
So yeah. Nice try, scammer. I'm not listing, in all of this, the mails which Neil received from people who didn't actually read his mail to d-d-a. I'm also not listing a story which I've only heard second-hand... actually no, this one is too good not to share. Someone went onto db.debian.org, did a search for every DD in France, and emailed every Jabber JID (since they look like email addresses) asking them to forward unwanted keys. All in all, the number of evildoers is quite low, relative to the number of legitimate claims: 12 baddies to 279 legitimate keys issued. But still, this is why the whole key issuing thing has been taking me so long, and why I have the convoluted signature-based validation system in place. Enjoy your keys, all 279 of you (or more by the time some of you read this). The offer has no explicit expiry on it: Valve will keep issuing keys as long as there is reason to, and Collabora will continue to administer their allocation as long as they remain our clients. It's a joint gift to the community: thousands of dollars' worth of games from Valve, and a significant amount of my time to administer them from Collabora.
